System Requirements
The more Gracenote data you license, the more objects and updates there are to process. Programs are the largest entities, and their overall number in the entitlement contributes the most to the volume of downloaded data and daily updates. Notably, entitlements that include the Program Database (as opposed to Linear/Streaming only) carry a significantly larger volume of Programs. Schedules are updated at least once daily per source (a new schedule day is added) and also receive updates to already delivered schedule days. Streaming catalogs vary in update activity; some churn more than others. Many updates are triggered by the schedule or catalog providers rather than by Gracenote directly.
The following table lists a few basic metrics (total object count, total data size*, updates per day) for several sample API streams. While your specific stream numbers may differ, this should provide a rough guideline for appropriately sizing the system.
- Stream A: linear schedule for ~3k sources with referenced programs
- Stream B: linear schedule for ~1k sources, ~30 streaming catalogs, referenced programs
- Stream C: linear schedule for ~10k sources, ~15 streaming catalogs, referenced programs plus program database for US (description languages: en, es)
| Stream | Endpoint | obj count | data size* | upd/day |
|---|---|---|---|---|
| Stream A | Sources | 3.2K | 13MB | 100 |
| Stream A | Schedules | 45K | 0.8GB | 17K |
| Stream A | Programs | 1.25M | 12GB | 23K |
| Stream A | Availabilities | – | – | – |
| Stream B | Sources | 0.9K | 4MB | 37 |
| Stream B | Schedules | 15K | 0.4GB | 7.5K |
| Stream B | Programs | 2.3M | 39GB | 35K |
| Stream B | Availabilities | 0.87M | 2.5GB | 29K |
| Stream C | Sources | 10K | 7MB | 230 |
| Stream C | Schedules | 194K | 3.1GB | 40K |
| Stream C | Programs | 19.5M | 170GB | 100K |
| Stream C | Availabilities | 1.37M | 3.3GB | 108K |
* measured on a Postgres DB with data ingested using the Product Demonstration Kit (PDK)
Data for some endpoints is always provided in full regardless of the entitlement; the largest such endpoint is Celebrities (~2M objects, ~6GB of data, ~3.5K daily updates).
Especially during implementation, consider storing the received XML responses locally instead of discarding them after ingestion. This helps with troubleshooting and, if needed, lets you rebuild your database from scratch quickly and conveniently (the data download is typically the longest part of the ingest process).
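As an illustration, the minimal sketch below archives each raw XML response before it is parsed, so the same files can later be replayed into an empty database without re-downloading. The base URL, authentication header, and `parse_and_ingest` hook are placeholders for your own setup, not actual Gracenote API details.

```python
import gzip
from datetime import datetime, timezone
from pathlib import Path

import requests  # third-party HTTP client

ARCHIVE_ROOT = Path("xml-archive")       # local archive of raw responses
API_BASE = "https://api.example.com/v3"  # placeholder, not the real base URL
API_KEY = "..."                          # placeholder credential


def archive_response(endpoint: str, body: bytes) -> Path:
    """Write one raw XML response to a gzipped, timestamped file."""
    stamp = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%S%f")
    path = ARCHIVE_ROOT / endpoint / f"{stamp}.xml.gz"
    path.parent.mkdir(parents=True, exist_ok=True)
    with gzip.open(path, "wb") as fh:
        fh.write(body)
    return path


def fetch_and_ingest(endpoint: str, params: dict) -> None:
    """Download one response, archive the raw XML, then ingest it."""
    resp = requests.get(f"{API_BASE}/{endpoint}", params=params,
                        headers={"Authorization": API_KEY}, timeout=300)
    resp.raise_for_status()
    archive_response(endpoint, resp.content)  # keep the original XML
    parse_and_ingest(endpoint, resp.content)  # your existing ingest logic


def rebuild_from_archive(endpoint: str) -> None:
    """Replay archived responses into an empty database without re-downloading."""
    for path in sorted((ARCHIVE_ROOT / endpoint).glob("*.xml.gz")):
        with gzip.open(path, "rb") as fh:
            parse_and_ingest(endpoint, fh.read())


def parse_and_ingest(endpoint: str, xml_body: bytes) -> None:
    """Placeholder for your own XML parsing and database writes."""
    raise NotImplementedError
```

Replaying gzipped local files is typically far faster than downloading the data again, which is what makes a from-scratch rebuild practical.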
Update Surges
Update surges can occur with or without prior announcement; one way to absorb them is sketched after the following examples:
- A streaming catalog provider replaces all deep links for existing video assets in their catalog. Gracenote processes this and delivers a corresponding surge of updates on the Program Availabilities endpoint. The actual number of updates depends on the size of the catalog.
- New lineups, sources, or catalogs are added to or removed from the client entitlement, resulting in a surge of updates.
- Gracenote adds a new data element and reseeds the affected endpoint(s). This will typically be communicated to the clients beforehand via notifications.
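Because the size of a surge is open-ended, it helps to design the ingest path so that it never needs to hold a full update set in memory or in a single transaction. The sketch below shows one way to do that, assuming parsed update records arrive as an iterable and a caller-supplied `write_batch` function performs the actual database writes; the function names and batch size are illustrative and not part of the Gracenote API or PDK.

```python
from itertools import islice
from typing import Callable, Iterable, Iterator, Sequence

BATCH_SIZE = 5_000  # illustrative value; tune to your database and hardware


def in_batches(records: Iterable[dict], size: int) -> Iterator[list]:
    """Yield fixed-size lists from an arbitrarily long stream of update records."""
    iterator = iter(records)
    while True:
        chunk = list(islice(iterator, size))
        if not chunk:
            return
        yield chunk


def ingest_updates(records: Iterable[dict],
                   write_batch: Callable[[Sequence[dict]], None]) -> int:
    """Apply updates in bounded batches so that a surge never requires one huge
    transaction or the full update set in memory. write_batch is whatever
    performs the upserts and the commit for one batch (e.g. executemany
    against Postgres)."""
    total = 0
    for chunk in in_batches(records, BATCH_SIZE):
        write_batch(chunk)
        total += len(chunk)
    return total
```

With this shape, a surge only affects how long the ingest loop runs, not how much memory or how large a transaction it requires.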